Welcome everybody to Pattern Recognition. Today we want to talk a bit about feature transforms and, in particular, think about some ideas on how to incorporate class information into such a transform.
The path that we are heading towards is discriminant analysis, and this is essentially one way of thinking about using class information in feature transforms. Remember that when we do discriminant analysis we do discriminative modeling, which means that we want to decompose the posterior into a factorization that uses the class priors and the class conditionals. Using Bayes' theorem and marginalization, we can decompose it into the following expression.
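As a sketch of that decomposition (standard Bayes' rule, with marginalization over the classes in the denominator; x denotes the feature vector and y the class, as used throughout the lecture):

```latex
p(y \mid \mathbf{x})
\;=\; \frac{p(\mathbf{x} \mid y)\, p(y)}{p(\mathbf{x})}
\;=\; \frac{p(\mathbf{x} \mid y)\, p(y)}{\sum_{y'} p(\mathbf{x} \mid y')\, p(y')}
```

Here p(y) are the class priors and p(x | y) are the class conditionals.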
What we will do in the following is choose a specific form of distribution for these probabilities; in particular, we want to use the Gaussian distribution to model our class conditionals. Remember that the Gaussian probability density function is given by a mean and a covariance matrix. Here we look at the class conditionals, which means that the means and the covariance matrices depend on the class y; this is why they carry the respective index. Remember the formulation of the Gaussian density, written out below, and of course, when you go to the exam, everybody should be able to write it down.
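For reference, here is a sketch of the class-conditional Gaussian density in its standard multivariate form (d is the feature dimension; this is the textbook formula, not copied verbatim from the slide):

```latex
p(\mathbf{x} \mid y)
\;=\; \mathcal{N}\!\left(\mathbf{x};\, \boldsymbol{\mu}_y, \boldsymbol{\Sigma}_y\right)
\;=\; \frac{1}{(2\pi)^{d/2}\, |\boldsymbol{\Sigma}_y|^{1/2}}
\exp\!\left(-\tfrac{1}{2}\,(\mathbf{x} - \boldsymbol{\mu}_y)^{\top} \boldsymbol{\Sigma}_y^{-1} (\mathbf{x} - \boldsymbol{\mu}_y)\right)
```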
In this formula you have the feature vector, the class means, and also the covariance matrices, which are positive definite. Please remember these properties.
You also remember some facts about Gaussian classifiers when we model the decision boundary and everything is Gaussian. If the two classes we are considering are Gaussian, then we will have a quadratic decision boundary; if they share the same covariance, the decision boundary is going to be linear in the components x_i of the feature vector. We have also seen the Naive Bayes approach, and Naive Bayes mapped onto a Gaussian classifier will essentially result in diagonal covariance matrices.
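To see where the quadratic and linear boundaries come from, here is a short sketch: taking the log-posterior and dropping terms that do not depend on y gives the discriminant function

```latex
g_y(\mathbf{x})
\;=\; -\tfrac{1}{2}\,(\mathbf{x} - \boldsymbol{\mu}_y)^{\top} \boldsymbol{\Sigma}_y^{-1} (\mathbf{x} - \boldsymbol{\mu}_y)
\;-\; \tfrac{1}{2}\log |\boldsymbol{\Sigma}_y|
\;+\; \log p(y)
```

which is quadratic in x in general. If all classes share the same covariance, the quadratic term is identical for every class and cancels when comparing classes, leaving a decision boundary that is linear in the components x_i.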
Also, if we had only a single covariance matrix for all the classes and if the priors were identical, then classification essentially reduces to a minimization of the so-called Mahalanobis distance. This is a distance where you measure the difference to the class centers and weight it with the inverse of the covariance matrix.
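Written out, with the shared covariance Σ and the class centers μ_y, this distance and the resulting decision rule are

```latex
d_{M}(\mathbf{x}, \boldsymbol{\mu}_y)
\;=\; \sqrt{(\mathbf{x} - \boldsymbol{\mu}_y)^{\top} \boldsymbol{\Sigma}^{-1} (\mathbf{x} - \boldsymbol{\mu}_y)},
\qquad
\hat{y} \;=\; \operatorname*{arg\,min}_y \; d_{M}(\mathbf{x}, \boldsymbol{\mu}_y)
```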
If we simplify this further and use an identity matrix as the covariance matrix, then we would essentially just look for the nearest neighbor among the class centers. Here the classifier simply computes the distance to the class centers and then selects the class center, or prototype vector, with the lowest distance. This is then simply a nearest-neighbor-style approach with the L2, or Euclidean, distance.
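A minimal NumPy sketch of this nearest-class-center rule (the helper names are mine, not from the lecture); passing no inverse covariance gives the Euclidean special case, passing one gives the Mahalanobis case:

```python
import numpy as np

def fit_centers(X, y):
    """Estimate one prototype (mean) vector per class."""
    classes = np.unique(y)
    centers = np.stack([X[y == c].mean(axis=0) for c in classes])
    return classes, centers

def classify(x, classes, centers, sigma_inv=None):
    """Assign x to the class whose center minimizes the squared
    Mahalanobis distance; sigma_inv=None means identity, i.e. Euclidean."""
    if sigma_inv is None:
        sigma_inv = np.eye(x.shape[0])
    diffs = centers - x                      # one row per class center
    d2 = np.einsum('ij,jk,ik->i', diffs, sigma_inv, diffs)
    return classes[np.argmin(d2)]

# Toy usage: two Gaussian blobs in 2-D
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 1.0, (50, 2)),
               rng.normal([3, 3], 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
classes, centers = fit_centers(X, y)
print(classify(np.array([2.5, 2.8]), classes, centers))  # -> most likely 1
```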
There are also ways to incorporate a little additional information into the covariance matrices, so that we can essentially switch between linear and quadratic decision boundaries by mixing the modeling approaches. You could model the total covariance, here sigma, or you could use the classwise covariance, sigma y, and we could introduce a mixing factor alpha between 0 and 1 that allows us to switch between classwise modeling and global modeling.
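As a formula (this is the blend just described, sometimes called regularized discriminant analysis; the linear interpolation is my reading of the slide, consistent with the alpha values discussed next):

```latex
\boldsymbol{\Sigma}_y(\alpha)
\;=\; \alpha\, \boldsymbol{\Sigma}_y \;+\; (1 - \alpha)\, \boldsymbol{\Sigma},
\qquad \alpha \in [0, 1]
```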
If we have alpha equal to 0, we end up with the global modeling, and this would yield a linear decision boundary; if we choose alpha to be 1, then we get the quadratic decision boundary. So we can essentially mix between the two.
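A small sketch of this interpolation in code (the function name is assumed, not from the lecture; same NumPy setting as the earlier example):

```python
import numpy as np

def blended_covariances(X, y, alpha):
    """Interpolate between classwise and global covariance estimates.
    alpha = 0.0 -> one shared global covariance (linear boundaries);
    alpha = 1.0 -> fully classwise covariances (quadratic boundaries)."""
    sigma_global = np.cov(X, rowvar=False)
    return {c: alpha * np.cov(X[y == c], rowvar=False)
               + (1.0 - alpha) * sigma_global
            for c in np.unique(y)}
```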
Now let's think about the implications for feature transforms. The question that we want to ask today is: can we find a feature transform that generates feature vectors, so a transform of our vectors x, and this transform we …
In this video, we start introducing discriminant transforms and look at their basic concept.
This video is released under CC BY 4.0. Please feel free to share and reuse.